Information content in Gaussian noise: optimal compression rates
Authors
Abstract
We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present accurate expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Further, we show how the entropy and the compression rate depend on the shape of this power spectrum, given different normalizations. The cases of white noise (w.n.), power-law noise f^p (including 1/f noise), (w.n. + 1/f) noise, and piecewise (w.n. + 1/f + 1/f²) noise are discussed in detail, and quantitative behaviours and useful approximations are provided.
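The abstract's central quantity can be illustrated numerically. For a Gaussian signal of standard deviation sigma quantized with a uniform step delta, the entropy of the quantized samples is well approximated by H ≈ log2(sigma·sqrt(2πe)/delta) bits, which grows only logarithmically with the signal amplitude. A minimal sketch, not taken from the paper (all parameter values are assumptions), comparing the empirical entropy with this analytic approximation:

```python
# Hypothetical sketch: empirical Shannon entropy of a linearly quantized
# Gaussian signal vs. the analytic approximation
#   H ~ log2(sigma * sqrt(2*pi*e) / delta)  [bits per sample].
import numpy as np

def quantized_entropy(sigma=1.0, delta=0.05, n=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma, n)
    q = np.round(x / delta).astype(np.int64)   # linear (uniform) quantization
    _, counts = np.unique(q, return_counts=True)
    p = counts / n
    return -np.sum(p * np.log2(p))             # empirical entropy in bits

sigma, delta = 1.0, 0.05
h_emp = quantized_entropy(sigma, delta)
h_theory = np.log2(sigma * np.sqrt(2.0 * np.pi * np.e) / delta)
```

For fine quantization (delta much smaller than sigma) the two values agree closely, and doubling the amplitude adds only one bit per sample, consistent with the logarithmic dependence stated above.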
Similar resources
1 Nov 1999 — Information content in uniformly discretized Gaussian noise: optimal compression rates
We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Entropy ...
Combined compression and denoising of images using vector quantization
Compression of a noisy source is usually a two stage problem, involving the operations of estimation (denoising) and quantization. A survey of literature on this problem reveals that for the squared error distortion measure, the best possible compression strategy is to subject the noisy source to an optimal estimator followed by an optimal quantizer for the estimate. What we present in this pap...
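The two-stage strategy described in this snippet can be sketched for the simplest case: a scalar Gaussian source observed in Gaussian noise, where the optimal (MMSE) estimator is the Wiener shrinkage E[S|Y] = sigma_s²/(sigma_s² + sigma_n²)·y, followed by a uniform quantizer. This is an illustrative assumption of mine, not code from the surveyed literature:

```python
# Hypothetical illustration of "estimate, then quantize" for a scalar
# Gaussian signal in additive Gaussian noise (all parameters assumed).
import numpy as np

rng = np.random.default_rng(1)
sigma_s, sigma_n, delta = 1.0, 0.5, 0.1
s = rng.normal(0.0, sigma_s, 100_000)          # clean source
y = s + rng.normal(0.0, sigma_n, s.size)       # noisy observation

shrink = sigma_s**2 / (sigma_s**2 + sigma_n**2)
s_hat = shrink * y                             # stage 1: MMSE estimate
s_q = delta * np.round(s_hat / delta)          # stage 2: uniform quantizer

mse_direct = np.mean((s - delta * np.round(y / delta))**2)
mse_two_stage = np.mean((s - s_q)**2)
```

Under squared-error distortion, denoising before quantization outperforms quantizing the noisy samples directly, matching the claim in the snippet.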
Gaussian source coding with spherical codes
A fixed-rate shape–gain quantizer for the memoryless Gaussian source is proposed. The shape quantizer is constructed from wrapped spherical codes that map a sphere packing in ℝ^(k−1) onto a sphere in ℝ^k, and the gain codebook is a globally optimal scalar quantizer. A wrapped Leech lattice shape quantizer is used to demonstrate a signal-to-quantization-noise ratio within 1 dB of the distortion-rate func...
The Reliability Function for the Additive White Gaussian Noise Channel at Rates above the Capacity
We consider the additive white Gaussian noise channels. We prove that the error probability of decoding tends to one exponentially for rates above the capacity and derive the optimal exponent function. We shall demonstrate that the information spectrum approach is quite useful for investigating this problem. Keywords—Additive white Gaussian noise channels, Strong converse theorem, Information s...
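The capacity that this strong-converse result refers to has a closed form for the real AWGN channel, C = (1/2)·log2(1 + P/N) bits per channel use; at any rate above C the decoding error probability tends to one. A minimal numeric check of the formula (the SNR value is an arbitrary assumption):

```python
# Capacity of the real-valued AWGN channel in bits per channel use.
import numpy as np

def awgn_capacity(snr):
    """C = (1/2) * log2(1 + P/N) for signal-to-noise ratio P/N."""
    return 0.5 * np.log2(1.0 + snr)

c = awgn_capacity(snr=15.0)   # P/N = 15 gives C = 0.5 * log2(16) = 2 bits/use
```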
Optimal Linear Compression Under Unreliable Representation and Robust PCA Neural ... (submitted to IEEE Transactions on Neural Networks)
In a typical linear data compression system the representation variables resulting from the coding operation are assumed totally reliable and therefore the solution in the mean-squared-error sense is an orthogonal projector to the so-called principal component subspace. When the representation variables are contaminated by additive noise which is uncorrelated with the signal, the problem is c...
Journal title:
Volume / Issue:
Pages: -
Publication date: 1998